Partial least-squares algorithm for weights initialization of backpropagation network

Authors

  • Tzu-Chien Ryan Hsiao
  • Chii-Wann Lin
  • Huihua Kenny Chiang
Abstract

This paper proposes a hybrid scheme that sets the initial weights and the optimal number of hidden nodes of a backpropagation network (BPN) by applying the loading weights and factor numbers of the partial least-squares (PLS) algorithm. The joint PLS and BPN method (PLSBPN) starts with a small residual error, modifies the latent weight matrices, and obtains a near-global minimum in the calibration phase. The performances of the BPN, PLS, and PLSBPN were compared for the near-infrared spectroscopic analysis of glucose concentrations in aqueous matrices. The results showed that the PLSBPN had the smallest root mean square error. The PLSBPN approach significantly alleviates some conventional problems of the BPN method by providing good initial weights, reducing the calibration time, obtaining an optimal solution, and easily determining the number of hidden nodes. © 2002 Elsevier Science B.V. All rights reserved.
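As a rough illustration of the scheme described above, the sketch below seeds a one-hidden-layer network from a PLS fit: the number of PLS factors fixes the hidden-node count, and the PLS loading weights become the input-to-hidden weight matrix. This is a minimal sketch assuming scikit-learn's PLSRegression and synthetic "spectra"; the zero biases, the small random output layer, and the plain gradient-descent refinement are illustrative assumptions, not the paper's exact recipe.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic "spectra": 100 samples, 50 wavelengths, one target (e.g. glucose).
X = rng.normal(size=(100, 50))
y = X[:, :5].sum(axis=1, keepdims=True) + 0.1 * rng.normal(size=(100, 1))

n_factors = 5                                    # PLS factor number -> hidden-node count
pls = PLSRegression(n_components=n_factors).fit(X, y)

W1 = pls.x_weights_.copy()                       # (50, 5) loading weights -> input-to-hidden init
b1 = np.zeros(n_factors)
W2 = rng.normal(scale=0.1, size=(n_factors, 1))  # assumed small random output layer
b2 = np.zeros(1)

# A few plain gradient-descent steps refine the PLS-seeded weights.
lr = 1e-3
for _ in range(200):
    h = np.tanh(X @ W1 + b1)                     # hidden layer seeded by PLS directions
    out = h @ W2 + b2
    err = out - y                                # gradient of 0.5 * squared error
    dh = (err @ W2.T) * (1 - h ** 2)             # backprop through tanh
    W2 -= lr * h.T @ err / len(X)
    b2 -= lr * err.mean(axis=0)
    W1 -= lr * X.T @ dh / len(X)
    b1 -= lr * dh.mean(axis=0)

rmse = np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training RMSE after PLS-seeded backprop: {rmse:.4f}")
```

Because the hidden layer starts out projecting onto latent factors that already explain the X-y covariance, training begins from a small residual error, which matches the behavior the abstract attributes to PLSBPN.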


Related articles

Accurate Initialization of Neural Network Weights by Backpropagation of the Desired Response

Proper initialization of a neural network is critical for successful training of its weights. Many methods have been proposed to achieve this, including heuristic least-squares approaches. In this paper, inspired by these previous attempts to train (or initialize) neural networks, we formulate a mathematically sound algorithm based on backpropagating the desired output through the layers of a ...
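The snippet above names the core idea: backpropagate the desired output through the layers and fit each layer by linear least squares. Below is a loose, generic sketch of that idea for a single hidden layer, assuming tanh activations and a fixed random output layer; it is not the paper's exact layer-wise algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))          # inputs
d = rng.normal(size=(200, 1))           # desired responses
n_hidden = 8

# Fix a random output layer, then map the desired response back through it.
W2 = rng.normal(size=(n_hidden, 1))
H_d = d @ np.linalg.pinv(W2)            # least-squares hidden targets
H_d = np.clip(H_d, -0.99, 0.99)         # keep the inverse activation well-defined
Z_d = np.arctanh(H_d)                   # desired hidden pre-activations

# The input layer now reduces to an ordinary linear least-squares problem.
W1, *_ = np.linalg.lstsq(X, Z_d, rcond=None)
mse = np.mean((np.tanh(X @ W1) @ W2 - d) ** 2)
print("MSE after least-squares initialization:", mse)
```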


A Bayesian approach for initialization of weights in backpropagation neural net with application to character recognition

The convergence rate of training algorithms for neural networks is heavily affected by the initialization of the weights. In this paper, an original algorithm for the initialization of weights in a backpropagation neural net is presented, with application to character recognition. The initialization method is mainly based on a customization of the Kalman filter, translating it into Bayesian statistics terms. A me...
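The truncated abstract only names the ingredients, so the following is a loose sketch of one standard way to phrase weight initialization in Kalman/Bayesian terms: the measurement update of a Kalman filter for a static parameter, which coincides with Bayesian linear regression, applied to an output layer's weights. The prior and noise variances and the use of hidden activations H are illustrative assumptions, not the paper's customization.

```python
import numpy as np

rng = np.random.default_rng(1)
H = np.tanh(rng.normal(size=(40, 6)))   # hidden activations: 40 samples x 6 nodes
y = rng.normal(size=40)                 # training targets

tau2, sigma2 = 1.0, 0.1                 # assumed prior and observation-noise variances
P0 = tau2 * np.eye(6)                   # Gaussian prior covariance over the weights
S = H @ P0 @ H.T + sigma2 * np.eye(40)  # innovation covariance
K = P0 @ H.T @ np.linalg.inv(S)         # Kalman gain for a static parameter
w_init = K @ y                          # posterior mean, used as the initial weights
print("initial output-layer weights:", w_init)
```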


Exchange Rate Forecasting with Ensemble K-PLS and Ensemble Neural Networks: A case study for the Indian Rupee/US Dollar

This presentation evaluates and benchmarks ensemble methods for time-series prediction of daily currency exchange rates, using ensembles based on feedforward neural networks and kernel partial least squares (K-PLS). Ensemble methods reduce the variance of the forecasts and allow for the assessment of confidence metrics and risk for the forecasting model. The use of neu...
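As a generic illustration of the variance-reduction claim (not the K-PLS models benchmarked in the study), the sketch below trains a small bootstrap ensemble of feedforward networks on a synthetic rate series and reads the spread of the member forecasts as a confidence metric. MLPRegressor, the lag features, and the series itself are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
t = np.arange(300.0)
rate = 45 + 0.01 * t + 0.5 * np.sin(t / 10)        # synthetic exchange-rate series

lags = 5                                           # predict from the last 5 days
X = np.column_stack([rate[i:len(rate) - lags + i] for i in range(lags)])
y = rate[lags:]

preds = []
for seed in range(10):                             # 10 bootstrap ensemble members
    idx = rng.integers(0, len(X), len(X))          # resample the training set
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000,
                         random_state=seed).fit(X[idx], y[idx])
    preds.append(model.predict(X[-20:]))           # forecast the last 20 steps

preds = np.array(preds)
print("ensemble mean forecast:", preds.mean(axis=0)[:3])
print("ensemble spread (confidence):", preds.std(axis=0)[:3])
```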


Training feedforward networks with the Marquardt algorithm

The Marquardt algorithm for nonlinear least squares is presented and is incorporated into the backpropagation algorithm for training feedforward neural networks. The algorithm is tested on several function approximation problems, and is compared with a conjugate gradient algorithm and a variable learning rate algorithm. It is found that the Marquardt algorithm is much more efficient than either...
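Because the Marquardt (Levenberg-Marquardt) step is a standard nonlinear least-squares routine, a compact way to reproduce this setup is to hand a small feedforward network's per-sample residuals to an off-the-shelf LM solver. This sketch assumes SciPy's least_squares with method="lm"; the network size and the sine target are illustrative choices, not the paper's benchmarks.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
X = np.linspace(-1, 1, 50)[:, None]                # 1-D function approximation task
y = np.sin(np.pi * X).ravel()

n_hidden = 5

def unpack(p):
    """Split the flat parameter vector into layer weights and biases."""
    W1 = p[:n_hidden].reshape(1, n_hidden)
    b1 = p[n_hidden:2 * n_hidden]
    W2 = p[2 * n_hidden:3 * n_hidden].reshape(n_hidden, 1)
    b2 = p[3 * n_hidden]
    return W1, b1, W2, b2

def residuals(p):
    W1, b1, W2, b2 = unpack(p)
    h = np.tanh(X @ W1 + b1)
    return (h @ W2).ravel() + b2 - y               # per-sample residuals for LM

p0 = rng.normal(scale=0.5, size=3 * n_hidden + 1)  # random start, 16 parameters
fit = least_squares(residuals, p0, method="lm")    # Levenberg-Marquardt
print("final sum of squared errors:", 2 * fit.cost)
```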


Journal:
  • Neurocomputing

Volume 50, Issue –

Pages –

Publication date: 2003